MULTIBOOST: A Multi-purpose Boosting Package

Authors

  • Djalel Benbouzid
  • Róbert Busa-Fekete
  • Norman Casagrande
  • François-David Collin
  • Balázs Kégl
Abstract

The MULTIBOOST package provides a fast C++ implementation of multi-class/multi-label/multi-task boosting algorithms. It is based on ADABOOST.MH, but it also implements popular cascade classifiers and FILTERBOOST. The package contains common multi-class base learners (stumps, trees, products, Haar filters). Further base learners and strong learners following the boosting paradigm can be easily implemented in a flexible framework.
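To make the boosting paradigm behind the package concrete, the following is a minimal, self-contained sketch of binary AdaBoost with decision-stump base learners. It is illustrative only: the class and function names are hypothetical and do not reflect the actual MULTIBOOST C++ API, and ADABOOST.MH, which the package implements, extends this scheme to the multi-class/multi-label setting.

```cpp
// A minimal sketch of the boosting paradigm with decision-stump base learners
// (binary AdaBoost for brevity). Names here are hypothetical and are NOT the
// MULTIBOOST API; ADABOOST.MH extends this scheme to multiple classes/labels.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Stump {
    int feature;       // index of the feature the stump thresholds
    double threshold;  // decision threshold
    int polarity;      // +1: predict +1 when x[feature] >= threshold; -1: flipped
    double alpha;      // coefficient assigned by the boosting loop

    int predict(const std::vector<double>& x) const {
        return x[feature] >= threshold ? polarity : -polarity;
    }
};

// Base learner: exhaustively pick the stump with the lowest weighted error.
Stump bestStump(const std::vector<std::vector<double>>& X,
                const std::vector<int>& y,            // labels in {-1, +1}
                const std::vector<double>& w) {       // example weights
    Stump best{0, 0.0, 1, 0.0};
    double bestErr = 2.0;                             // weighted errors are <= 1
    const std::size_t d = X[0].size();
    for (std::size_t f = 0; f < d; ++f) {
        for (std::size_t i = 0; i < X.size(); ++i) {
            for (int pol : {1, -1}) {
                Stump s{static_cast<int>(f), X[i][f], pol, 0.0};
                double err = 0.0;
                for (std::size_t j = 0; j < X.size(); ++j)
                    if (s.predict(X[j]) != y[j]) err += w[j];
                if (err < bestErr) { bestErr = err; best = s; }
            }
        }
    }
    return best;
}

// Strong learner: T rounds of AdaBoost with exponential reweighting.
std::vector<Stump> adaBoost(const std::vector<std::vector<double>>& X,
                            const std::vector<int>& y, int T) {
    const std::size_t n = X.size();
    std::vector<double> w(n, 1.0 / n);
    std::vector<Stump> ensemble;
    for (int t = 0; t < T; ++t) {
        Stump s = bestStump(X, y, w);
        double err = 0.0;
        for (std::size_t i = 0; i < n; ++i)
            if (s.predict(X[i]) != y[i]) err += w[i];
        err = std::max(err, 1e-12);                   // guard against log of 0
        s.alpha = 0.5 * std::log((1.0 - err) / err);
        double z = 0.0;
        for (std::size_t i = 0; i < n; ++i) {         // reweight and renormalize
            w[i] *= std::exp(-s.alpha * y[i] * s.predict(X[i]));
            z += w[i];
        }
        for (double& wi : w) wi /= z;
        ensemble.push_back(s);
    }
    return ensemble;
}

int predict(const std::vector<Stump>& ensemble, const std::vector<double>& x) {
    double score = 0.0;
    for (const Stump& s : ensemble) score += s.alpha * s.predict(x);
    return score >= 0.0 ? 1 : -1;
}

int main() {
    // Tiny one-dimensional toy problem: positives lie above 0.5.
    std::vector<std::vector<double>> X = {{0.1}, {0.3}, {0.6}, {0.9}};
    std::vector<int> y = {-1, -1, 1, 1};
    std::vector<Stump> ensemble = adaBoost(X, y, 10);
    std::printf("prediction at x = 0.7: %d\n", predict(ensemble, {0.7}));
    return 0;
}
```

MULTIBOOST itself separates the base-learner and strong-learner roles into a flexible framework, as the abstract notes; the sketch above collapses both into a single file purely for brevity.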


Similar Resources

ADABOOK & MULTIBOOK: Adaptive Boosting with Chance Correction

There has been considerable interest in boosting and bagging, including the combination of the adaptive techniques of AdaBoost with the random selection with replacement techniques of Bagging. At the same time there has been a revisiting of the way we evaluate, with chance-corrected measures like Kappa, Informedness, Correlation or ROC AUC being advocated. This leads to the question of whether ...

Adaptive Boosting with Chance Correction

There has been considerable interest in boosting and bagging, including the combination of the adaptive techniques of AdaBoost with the random selection with replacement techniques of Bagging. At the same time there has been a revisiting of the way we evaluate, with chance-corrected measures like Kappa, Informedness, Correlation or ROC AUC being advocated. This leads to the question of whether ...

Fast Training of Effective Multi-class Boosting Using Coordinate Descent Optimization

We present a novel column generation based boosting method for multi-class classification. Our multi-class boosting is formulated in a single optimization problem as in [1, 2]. Different from most existing multi-class boosting methods, which use the same set of weak learners for all the classes, we train class specified weak learners (i.e., each class has a different set of weak learners). We s...

A Scalable and Parallel Implementation of the MultiBoost Library’s AdaBoost.MH (Adaptive Boosting) Algorithm

This report explores a parallel implementation of the C++ MultiBoost open source library’s AdaBoost.MH (Adaptive Boosting) algorithm (D. Benbouzid, 2012), and the changes in performance as measured on the Texas Advanced Computing Center’s Stampede supercomputer. This parallel implementation is based on the AdaBoost.PL Algorithm (Palit & Reddy, 2012). The baseline is the AdaBoost.MH algorithm, and...

ada: An R Package for Stochastic Boosting

Boosting is an iterative algorithm that combines simple classification rules with ‘mediocre’ performance in terms of misclassification error rate to produce a highly accurate classification rule. Stochastic gradient boosting provides an enhancement which incorporates a random mechanism at each boosting step showing an improvement in performance and speed in generating the ensemble. ada is an R ...

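The "random mechanism" mentioned in the ada abstract above is, in stochastic gradient boosting, a fresh random subsample of the training set drawn at each boosting round, on which that round's base learner is fit. Below is a minimal illustration of that subsampling step; it is a hypothetical C++ helper written for this page, not code from the ada package (which is written in R).

```cpp
// Per-round random subsampling as used in stochastic gradient boosting:
// each boosting round fits its base learner on a random fraction of the
// training set rather than on all of it. Hypothetical helper for illustration.
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

// Draw floor(fraction * n) distinct example indices uniformly at random.
std::vector<int> sampleRound(int n, double fraction, std::mt19937& rng) {
    std::vector<int> idx(n);
    std::iota(idx.begin(), idx.end(), 0);       // 0, 1, ..., n-1
    std::shuffle(idx.begin(), idx.end(), rng);
    idx.resize(static_cast<std::size_t>(fraction * n));
    return idx;
}

int main() {
    std::mt19937 rng(42);
    // In a real boosting loop, each round's base learner would be trained
    // only on the examples whose indices are drawn here.
    for (int round = 0; round < 3; ++round) {
        std::vector<int> subsample = sampleRound(10, 0.5, rng);
        std::printf("round %d subsample:", round);
        for (int i : subsample) std::printf(" %d", i);
        std::printf("\n");
    }
    return 0;
}
```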

Journal:
  • Journal of Machine Learning Research

Volume 13, Issue: -

Pages: -

Publication date: 2012